
Tension is rising: Photographers declared war on artificial intelligence!

Home / AI

Generative artificial intelligence is slowly settling into every aspect of our lives, but it needs vast amounts of data to train its models. Professional artists, meanwhile, are uncomfortable with generative AI companies using their work to train these technologies. Photographers will soon have an effective way to respond to this that does not require going to court. Here are the details…

Tension between professional artists and artificial intelligence is rising!

Generative AI burst onto the scene almost a year ago with the release of OpenAI’s ChatGPT chatbot, and it has since managed to touch almost every aspect of our lives. ChatGPT is extremely adept at communicating in a natural, human-like way. To gain this ability, however, it has to be trained on masses of data from the web.

Similar generative AI tools can also generate images from text. Like ChatGPT, however, they are trained on images scraped from the web, and this brings a different set of problems with it.


Unfortunately, this means the works of artists and photographers are being used as raw material for training artificial intelligence, and companies are using those works without permission or compensation to develop their own generative AI tools.

Professional artists are deeply unsettled by this situation and are working to protect themselves. To that end, a team of researchers developed a tool called Nightshade, which can confuse a training model and cause it to output erroneous images in response to prompts. Nightshade adds invisible pixel-level changes to a work of art before it is uploaded to the web, thereby poisoning the training data of artificial intelligence models.
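Nightshade’s real perturbations are adversarially optimized against specific text-to-image models and are far more sophisticated than a simple filter. Purely as a toy illustration of the general idea of baking a faint, hard-to-see change into an image’s pixels before publishing it, a sketch using the Pillow and NumPy libraries might look like this (file names and parameters are hypothetical):

```python
# Illustrative sketch only: this is NOT Nightshade's algorithm, just the
# general idea of adding a low-amplitude pixel pattern that viewers will
# not notice but that a web scraper will ingest along with the image.
import numpy as np
from PIL import Image


def embed_faint_pattern(src_path: str, dst_path: str, strength: int = 2) -> None:
    """Add a small pseudo-random offset to every pixel before uploading."""
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.int16)

    # Deterministic noise so the same image always gets the same pattern.
    rng = np.random.default_rng(seed=42)
    pattern = rng.integers(-strength, strength + 1, size=img.shape, dtype=np.int16)

    # Keep values in the valid 0-255 range and save the modified copy.
    poisoned = np.clip(img + pattern, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(dst_path)


# Hypothetical usage:
# embed_faint_pattern("artwork.png", "artwork_protected.png")
```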

Using the tool to poison training data in this way can harm future iterations of image-generating AI models such as DALL-E, Midjourney, and Stable Diffusion, and may even render some of their outputs useless. With this tool, dogs turn into cats and cars turn into cows.

According to University of Chicago professor Ben Zhao, who led the research team behind Nightshade, the tool will help shift the balance of power back toward artists. The professor added that it could serve as a warning shot to tech companies that ignore copyright and intellectual property.
